Asymptotic and non-asymptotic convergence properties of stochastic approximation with controlled Markov noise without ensuring stability
Authors
Abstract
This paper studies both the asymptotic and non-asymptotic convergence properties of stochastic approximation algorithms with controlled Markov noise when stability of the iterates (an important condition for almost sure convergence) is hard to prove. We do so by deriving a lower bound on the lock-in probability of such frameworks, i.e., the probability of convergence to a specific attractor of the o.d.e. limit given that the iterates visit its domain of attraction after a sufficiently large number of iterations $n_0$. Under the more general assumption of controlled Markov noise supported on a bounded subset of Euclidean space, we recover the same bound available in the literature for the case of controlled i.i.d. noise (i.e., martingale difference noise). We use these results to prove almost sure convergence of the iterates to the specified attractor when sufficient conditions guaranteeing asymptotic tightness of the iterates are satisfied. Another important corollary of our results is a statement similar to the well-known Kushner-Clark lemma for stochastic approximation algorithms with controlled Markov noise, but without the assumption of stability of the iterates: for step-size sequences $\{1/n^k\}$ with $1/2 < k \le 1$, and $\{1/(n(\log n)^k)\}$ with $k \le 1$, the iterates converge almost surely to a local attractor provided they belong infinitely often to an open set that contains the attractor and whose compact closure lies in its domain of attraction. Additionally, we show that such results can be used to derive a sample-complexity estimate for such recursions, which is then used to predict the optimal step-size.
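For reference, here is a minimal sketch of the standard controlled-Markov-noise recursion and the lock-in probability these statements refer to; the notation ($h$ the driving vector field, $Y_n$ the controlled Markov noise, $M_{n+1}$ a martingale difference sequence, $a(n)$ the step-sizes) is assumed here for illustration, and the paper's exact assumptions may differ:

% Sketch of the standard setup (assumed notation, not taken verbatim from the paper)
\begin{align*}
  x_{n+1} &= x_n + a(n)\bigl[h(x_n, Y_n) + M_{n+1}\bigr], \\
  a(n) &= \frac{1}{n^k},\ \tfrac12 < k \le 1,
  \qquad\text{or}\qquad
  a(n) = \frac{1}{n(\log n)^k},\ k \le 1,
\end{align*}

and the lock-in probability being lower-bounded is $\mathbb{P}\bigl(x_n \to A \,\big|\, x_{n_0} \in B\bigr)$, where $A$ is a local attractor of the o.d.e. limit and $B \supset A$ is an open set whose compact closure lies in the domain of attraction of $A$.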
Similar papers
Two Time-Scale Stochastic Approximation with Controlled Markov Noise and Off-Policy Temporal-Difference Learning
We present for the first time an asymptotic convergence analysis of two-timescale stochastic approximation driven by controlled Markov noise. In particular, both the faster and slower recursions have non-additive Markov noise components in addition to martingale difference noise. We analyze the asymptotic behavior of our framework by relating it to limiting differential inclusions in both times...
Continuous dependence on coefficients for stochastic evolution equations with multiplicative Lévy Noise and monotone nonlinearity
Semilinear stochastic evolution equations with multiplicative Lévy noise are considered. The drift term is assumed to be monotone nonlinear and with linear growth. Unlike other similar works, we do not impose coercivity conditions on coefficients. We establish the continuous dependence of the mild solution with respect to initial conditions and also on coefficients. As corollaries of ...
Approximation of stochastic advection diffusion equations with finite difference scheme
In this paper, a high-order and conditionally stable stochastic difference scheme is proposed for the numerical solution of the Itô stochastic advection diffusion equation with one dimensional white noise process. We applied a fourth-order finite difference approximation for discretizing the spatial derivative of this equation. The main properties of deterministic difference schemes,...
Stochastic Approximation: A Survey
Stochastic recursive algorithms, also known as stochastic approximation, take many forms and have numerous applications. It is the asymptotic properties that are of interest. The early history, starting with the work of Robbins and Monro, is discussed. An approach to proofs of convergence with probability one is illustrated by a stability-type argument. For general noise processes and algorithm...
Asymptotic Properties of Two Time-Scale Stochastic Approximation Algorithms with Constant Step Sizes
Asymptotic properties of two time-scale stochastic approximation algorithms with constant step sizes are analyzed in this paper. The analysis is carried out for the algorithms with additive noise, as well as for the algorithms with non-additive noise. The algorithms with additive noise are considered for the case where the noise is state-dependent and admits the decomposition as a sum of a mart...